Patent abstract:
The invention relates to a system for characterizing an environment of a vehicle, the system comprising the following: a projection means (210) adapted to project a pattern of laser light in a series of pulses in the direction of the environment; a detector (220) comprising a plurality of pixels, the detector (220) being configured to detect light representing the laser light pattern as reflected by the environment in synchronization with the series of pulses; and a processing means (240) configured to calculate distances to objects (99) in the environment as a function of exposure values generated by the pixels in response to the detected light; wherein the detector (220) is further configured to detect light forming a two-dimensional image of the environment at times that do not coincide with the series of pulses or at pixels that do not receive the light representing the pattern of laser light as reflected by the environment.
Publication number: BE1025547B1
Application number: E2018/4999
Filing date: 2018-01-02
Publication date: 2019-04-09
Inventor: Filip Geuens
Applicant: Xenomatix Nv
IPC main class:
Patent description:

BE2018/4999 System for characterizing the environment of a vehicle
FIELD OF THE INVENTION
The present invention relates to the field of systems for determining a distance to an object, in particular to sensing systems to be used for the characterization of an environment or a part thereof, such as can be used to detect obstacles in the vicinity of a vehicle.
Background
In the field of distance sensing technology, in particular with regard to the generation of high resolution maps of an environment that can be used in control and navigation applications including automotive and industrial applications, game applications, and mapping applications, it is known to use triangulation-based and time-of-flight based sensing to determine the distance of objects from a sensor.
A highly accurate mid-range environment sensing system for vehicles using triangulation is known from the international patent application publication WO 2015/004213 A1 in the name of the present applicant. In that patent application, the localization of objects is based on the projection of pulsed radiation spots and the analysis of the displacement of detected spots with respect to predetermined reference spot positions. More specifically, the system of the cited patent application uses triangulation. However, the accuracy that can be achieved correlates with the triangulation base, which limits the miniaturization that can be achieved.
Time-of-flight based techniques include the use of RF-modulated sources, range-gated imagers, and direct time-of-flight (DToF) imagers. For the use of RF-modulated sources and range-gated imagers, it is necessary to illuminate the entire scene of interest with a modulated or pulsed source. Direct time-of-flight systems, like most LIDARs, mechanically scan the area of interest with a pulsed beam, the reflection of which is sensed with a pulse detector. The optical power emitted by present-day semiconductor lasers cannot meet the power requirements necessary for the known LIDAR systems to be of practical use in automotive applications (e.g., for ranges up to 250 m). Unpublished European Patent Application No. EP 15 191 288.8, in the name of the present applicant, describes a system for determining a distance to an object that overcomes such limitations. It comprises the following: a solid-state light source adapted to project a pattern of individual spots of laser light in the direction of the object in a series of pulses; a detector comprising a plurality of pixels, the detector being configured to detect light representing the pattern of individual spots as reflected by the object in synchronization with the series of pulses; and a processing means configured to calculate the distance to the object as a function of exposure values generated by the pixels in response to the detected light. The pixels are configured to generate the exposure values by accumulating, for each pulse of the series, a first amount of electrical charge representative of a first amount of light reflected by the object during a first predetermined time window and a second amount of electrical charge representative of a second amount of light reflected by the object during a second predetermined time window, the second predetermined time window occurring after the first predetermined time window.
Given the increasing reliance on remote sensing technology to provide vehicle safety features, advanced driver assistance systems (ADAS), and autonomous (or "self-driving") cars, there is a need for a system that more accurately characterizes the environment of the vehicle in which the system is mounted.
Summary of the invention
According to an aspect of the present invention, there is provided a system for characterizing the environment of a vehicle, the system comprising the following:
- a projection means adapted to project a pattern of laser light in a series of pulses in the direction of the environment;
- a detector comprising a plurality of pixels, the detector being configured to detect light representing the pattern of laser light as reflected by the environment in synchronization with the series of pulses; and
- a processing means configured to calculate distances to objects in the environment as a function of exposure values generated by the pixels in response to the detected light;
the detector being further configured to detect light forming a two-dimensional image of the environment at times that do not coincide with the series of pulses or at pixels that do not receive the light representing the pattern of laser light as reflected by the environment.
It is an advantage of the invention that the characterization of the environment can be improved by combining 2D information and 3D information obtained from the same sensor at interleaved times. The 3D information is obtained from the system when it is used as a range sensor, by recording the reflected light patterns in synchronization with their projection; and the 2D information is obtained from the system when it is used as a digital camera between the range sensing pulses.
The projection means preferably comprises a solid-state laser; it may in particular be a VCSEL array or a solid-state laser provided with an adequate grating. The laser light pattern is preferably a pattern of spots, preferably individual spots. The series of pulses can be repeated periodically to enable continuous updating of the characterization of the environment.
In an embodiment of the system according to the present invention, the pixels are configured to generate the exposure values by accumulating, for all pulses of the series, a first amount of electrical charge representative of a first amount of light reflected by the objects during a first predetermined time window and a second amount of electrical charge representative of a second amount of light reflected by the objects during a second predetermined time window, the second predetermined time window occurring after the first predetermined time window.
It is an advantage of this embodiment that it uses range-gated LIDAR techniques to obtain accurate distance information in a small form factor.
In an embodiment of the system according to the present invention, the processing means is adapted to determine the distances by determining a displacement of features of the detected light representing the pattern of laser light as reflected by the environment with respect to predetermined feature positions.
In this embodiment, the respective exposure values of the different pixels of the detector are analyzed to determine in which pixels reflections of the projected pattern (e.g., spots) are detectable, and in which they are not. In this way, the displacement of parts of the reflected pattern relative to the predetermined positions (e.g., spot positions) can be determined, yielding information about the distance of the objects that have reflected the respective parts of the projected pattern. It is an advantage of this embodiment that it uses known triangulation-based techniques to obtain accurate distance information.
In one embodiment, the system of the present invention comprises an illumination means configured to project a light beam onto the environment at the times that do not coincide with the sequence of pulses.
The light beam is intended for homogeneous illumination of the relevant part of the environment, which preferably coincides with the field of view of the detector; thus, unlike the projected pattern used for distance determination, it must be substantially homogeneous. It is an advantage of this embodiment that the environment to be captured in the 2D image can be adequately illuminated even when the ambient light conditions are unfavorable (e.g., at night).
In a particular embodiment, the projection means and the illumination means share a common light source.
It is an advantage of this particular embodiment that the system can be kept compact by avoiding the need for multiple light sources.
In a more particular embodiment, the common light source comprises a VCSEL array, and the illumination means further comprises an actively controlled diffuser, which is configured to be activated at the times that do not coincide with the sequence of pulses, so as to diffuse light originating from the VCSEL array to form the beam.
It is an advantage of this more particular embodiment that a VCSEL array, which is very suitable for projecting structured light, can also be used to provide the illumination required for 2D image recording.
In an embodiment of the system according to the present invention, the plurality of pixels of the detector are provided with a time-dependent filter that allows light in different wavelength bands to reach the plurality of pixels at different times.
It is an advantage of this embodiment that an RGB image can be produced by combining three different 2D exposures that are shifted slightly over time. An RGB image provides more accurate automated feature recognition and is generally more suitable for visual reproduction for a human user than a monochrome image.
In an embodiment of the system according to the present invention, different pixels of the plurality of pixels of the detector are provided with different filters that allow light in different wavelength bands to reach the different pixels at said times.
It is an advantage of this embodiment that an RGB image can be produced by combining exposure values obtained on pixels that are slightly shifted in space. An RGB image provides more accurate automated feature recognition and is generally more suitable for visual reproduction for a human user than a monochrome image.
According to an aspect of the present invention, there is provided a vehicle comprising the system of any one of the preceding claims which is arranged to characterize an area surrounding the vehicle.
The present invention is very suitable for use in vehicles, in particular road vehicles such as automobiles. The system can therefore contribute to vehicle safety features, advanced driver assistance systems (ADAS), and even autonomous (or "self-driving") cars. Since sensors in automotive applications compete for space, it is an additional advantage of the present invention that it combines a multitude of functions in the same sensor, thereby providing 2D and 3D sensor fusion without needing an additional external 2D sensor.
Brief description of the figures
These and other aspects and advantages of the present invention will now be described in more detail with reference to the accompanying drawings, in which:
Figure 1 schematically illustrates an embodiment of the system according to the present invention;
Figures 2-4 illustrate the operating principles of a LIDAR-based embodiment of the system according to the present invention;
Figure 5 presents various time diagrams that can be used in embodiments of the present invention;
Figure 6 schematically illustrates a pixel device for use in an embodiment of the system of the present invention; and
Figure 7 schematically illustrates time diagrams that can be used in embodiments of the present invention.
Detailed description of the embodiments
Figure 1 schematically illustrates an embodiment of the system according to the present invention.
The system is intended and adapted to characterize the environment of a vehicle. The environment may include the area in front of the vehicle, including the road surface ahead of the vehicle; similarly, it may include the area behind the vehicle (in particular, when the vehicle reverses); it may include any space in the vicinity of the vehicle where obstacles or other road users may be present. The environment of the vehicle can extend to a distance of meters, tens of meters, or even hundreds of meters. The characterization is carried out by determining distances to objects in the environment. Although distance is the fundamental measured parameter, estimates of derived quantities such as speed (including the direction of movement) and acceleration of the detected objects can be determined on the basis of multiple distance measurements at different times. For complex objects, which can be recognized by combining information from several spatially diverse measuring points, additional parameters such as orientation and rotation can also be derived. Since all measurements are relative to the sensor, measurements of "fixed" objects can also provide information about the speed, acceleration, and orientation (slope, rotation, tilt) of the sensor, and thus of the vehicle on which it is mounted.
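The derivation of speed and acceleration from repeated distance measurements mentioned above can be sketched with simple finite differences. This is an illustrative sketch, not taken from the patent; the 60 Hz frame period is borrowed from the example cadence given later in the description, and the function name and sample values are assumptions.

```python
# Finite-difference estimates of radial speed and acceleration of a detected
# object from three successive range measurements taken one frame apart.

FRAME_PERIOD = 1.0 / 60.0  # s; e.g. a 60 Hz frame cadence (assumed)

def speed_and_acceleration(d0, d1, d2, dt=FRAME_PERIOD):
    """Estimate radial speed (m/s) and acceleration (m/s^2) from three
    consecutive distance samples d0, d1, d2 (metres), spaced dt apart."""
    v1 = (d1 - d0) / dt   # speed over the first interval
    v2 = (d2 - d1) / dt   # speed over the second interval
    a = (v2 - v1) / dt    # change of speed yields acceleration
    return v2, a

# An object closing in at constant speed: 50.0 m, 49.5 m, 49.0 m
v, a = speed_and_acceleration(50.0, 49.5, 49.0)
# v is about -30 m/s (approaching), a is about 0 m/s^2
```

A negative speed indicates an approaching object; in practice such estimates would be smoothed over more than three samples to suppress measurement noise.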
The system comprises a projection means 210 adapted to project a pattern of laser light in the direction of the environment in a series of pulses; a detector 220 comprising a plurality of pixels, the detector 220 being configured to detect light representing the pattern of laser light as reflected by the environment in synchronization with the series of pulses; and a processing means 240 configured to calculate distances to objects 99 in the environment as a function of exposure values generated by the pixels in response to the detected light.
If the system operates on the time-of-flight (LIDAR) principle, the pixels of the detector 220 may be configured to generate the exposure values by accumulating, for all pulses of the series, a first amount of electrical charge representative of a first amount of light reflected by the objects 99 during a first predetermined time window 10 and a second amount of electrical charge representative of a second amount of light reflected by the objects during a second predetermined time window 20, the second predetermined time window 20 occurring after the first predetermined time window 10.
The operating principle of such a LIDAR-based system is illustrated by the time diagrams in Figures 2-4. For the sake of clarity, only a single pulse of the periodically repeated pulse train is illustrated, consisting of a first time window 10 and a second time window 20. The principle is described, by way of example, with reference to a projection means comprising a solid-state light source.
As can be seen in Figure 2, during the first time window 10 the solid-state light source 210 is in its "ON" state, radiating the pattern of light spots towards the environment. During the second time window 20, the solid-state light source 210 is in its "OFF" state.
The arrival of the reflected light at the detector 220 is delayed relative to the start of the projection by an amount of time proportional to the distance travelled (approximately 3.3 ns/m in free space). Because of this delay, only a portion of the reflected light will be detected at the first well 221 of the detector 220, which is only activated during the first time window 10. Thus, the charge accumulated in the first well during its period of activation (the first time window 10) consists of a portion that represents only the noise and the ambient light impinging on the pixel prior to the arrival of the reflected pulse, and a portion that represents the noise, the ambient light, and the leading edge of the reflected pulse.
The trailing part of the reflected pulse will be detected at the second well 222 of the detector 220, which is only activated during the second time window 20, which preferably immediately follows the first time window 10. Thus, the charge accumulated in the second well during its period of activation (the second time window 20) consists of a part that represents the noise, the ambient light, and the trailing edge of the reflected pulse, and a part that represents only the noise and the ambient light impinging on the pixel after the end of the reflected pulse.
The greater the distance between the reflecting object 99 and the system 200, the smaller the portion of the pulse that will be detected in the first well 221 and the greater the portion of the pulse that will be detected in the second well 222.
If the leading edge of the reflected pulse arrives after the closing of the first well 221 (i.e., after the end of the first time window 10), the portion of the reflected pulse that can be detected in the second well 222 decreases again with increasing time-of-flight delay.
The resulting amounts of charge A, B in each of the respective wells 221, 222 for varying distances of the object 99 are shown in Figure 3b. To simplify the representation, the diagram does not take into account the attenuation of light with distance according to the inverse square law. It is clear that for time-of-flight delays up to and including the combined duration of the first time window 10 and the second time window 20, the time-of-flight delay can in principle be unambiguously derived from the values of A and B:
- For time-of-flight delays up to and including the duration of the first time window 10, B is proportional to the distance of the object 99. To easily arrive at an absolute distance determination, the normalized value B / (B + A) can be used, whereby the effect of imperfect reflectivity of the detected object and of the inverse square law is removed.
- For time-of-flight delays exceeding the duration of the first time window 10, A consists only of daylight and noise contributions (not illustrated), and C - B is substantially proportional (after correction for the inverse square law) to the distance of the object 99, where C is an offset value.
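The two-window principle above can be sketched numerically. The sketch below models an ideal rectangular pulse of unit energy, ignores ambient light, noise, and inverse-square attenuation (as Figure 3b does), and then recovers the distance from the normalized value B / (B + A); the function names and the 100 ns window width are illustrative assumptions, not values from the patent.

```python
# Two-window range gating: the echo of a pulse of width tw_s is split between
# window 10 (charge A) and the immediately following window 20 (charge B).

C = 299_792_458.0  # speed of light, m/s

def split_charges(distance_m, tw_s):
    """Return (A, B) charge fractions for a target at distance_m, given
    pulse/window width tw_s, unit pulse energy, no noise or attenuation."""
    td = 2.0 * distance_m / C                 # round-trip time-of-flight delay
    # overlap of the echo interval [td, td + tw_s) with window 10 = [0, tw_s)
    a = max(0.0, min(td + tw_s, tw_s) - td)
    # overlap with window 20 = [tw_s, 2 * tw_s)
    b = max(0.0, min(td + tw_s, 2 * tw_s) - max(td, tw_s))
    return a, b

def estimate_distance(a, b, tw_s):
    """Invert the split for delays up to tw_s using B / (B + A), which
    cancels reflectivity and inverse-square effects."""
    td = tw_s * b / (a + b)
    return C * td / 2.0

tw = 100e-9                       # 100 ns window (assumed) -> ~15 m span here
a, b = split_charges(7.5, tw)
d = estimate_distance(a, b, tw)   # recovers 7.5 m
```

For delays beyond the first window, B alone decreases again, which is why the description switches to the C - B relation in that regime.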
Although Figures 2 and 3 illustrate the principle of the invention with respect to a single pulse emitted in the time window 10, it will be understood that the illustrated pulse is part of a series of pulses as defined above. Figure 4 schematically illustrates exemplary timing characteristics of such a series. As illustrated, the illumination scheme 40 consists of a repeated emission of a series 30 of individual pulses 10. The width of the individual pulses 10 is determined by the maximum operating range. The entire sequence can be repeated at a frequency of, for example, 60 Hz.
Several optional features of a time-of-flight based sensing system are described in the unpublished European Patent Application No. EP 15 191 288.8, in the name of the present applicant, the content of which is incorporated by reference for the purpose of enabling the person skilled in the art to incorporate these features in embodiments of the present invention.
If the system operates according to the triangulation principle, the processing means 240 may be adapted to determine the distances by determining a displacement of features of the detected light representing the pattern of laser light as reflected by the environment with respect to predetermined feature positions. Preferably, the projected pattern is a pattern of spots of laser light, and the distances are determined by determining a displacement of detected spots, which represent the projected spots as reflected by objects in the environment, with respect to predetermined spot positions.
Various optional features of a triangulation-based sensing system are described in the international patent application publication WO 2015/004213 A1 in the name of the present applicant, the content of which is incorporated by reference for the purpose of enabling the person skilled in the art to incorporate these features in embodiments of the present invention.
According to the invention, the detector 220 is further configured to detect light forming a two-dimensional image of the environment at times that do not coincide with the series of pulses or at pixels that do not receive the light representing the pattern of laser light as reflected by the environment.
Given the accuracy requirements of vehicular range sensing systems, typical CMOS sensor arrays are selected that have a total array size of the order of 1 megapixel. The inventors have found that, although a relatively coarse pixel size of the order of 10 μm is used in such sensors, the combined sensor is capable of producing a surprisingly acceptable 2D image quality when used in combination with adequate optics to form a rudimentary digital camera.
The invention, and in particular the concept of detecting light that forms a two-dimensional image of the environment at times that do not coincide with the series of pulses, is based inter alia on the insight of the inventors that in the time intervals between the range sensing pulses, or between series of range sensing pulses, the pixel array can be used to record digital 2D images of the environment. In this way, two different functions, which provide complementary information, can be combined in a single sensor.
The invention, and in particular the concept of detecting light forming a two-dimensional image of the environment on pixels that do not receive the light representing the pattern of laser light as reflected by the environment, is further based on the insight of the inventors that, since range sensing systems rely on the detection of reflections of specific light patterns, such as line patterns or spot patterns, only a small subset of the total number of pixels is actually used at any given time. This specific concept can be used when sufficient ambient light reaches the detector while the range sensing function (projection and detection of a light pattern) is active, so that the pixels that do not receive a part of the projected pattern can form an image from the received light.
Although the pattern-reflection based range sensing provides a depth map that contains three-dimensional information only at the points exposed by the projected pattern of laser light, the 2D images recorded in between provide a visual snapshot of the entire environment. The depth map can be registered onto the 2D images, and depth information can be obtained for each pixel in the 2D images by interpolating the depth map values. Since the 2D images and the 3D information are obtained from the same sensor, there is no parallax between the different images, which facilitates registration.
Preferably, the interpolation of the depth map values is assisted by the pixel values in the 2D images. Thus, for example, an area between different depth map values that corresponds to a regular illumination or color gradient in the 2D image can be interpolated in the depth dimension by linear interpolation. An area between different depth map values that includes an abrupt step in the illumination or color value can be interpolated by a stepwise constant depth value, the step in the depth value being made to coincide with the step in the illumination or color value.
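The image-guided interpolation described above can be sketched in one dimension: between two sparse depth samples, use linear interpolation when the 2D image varies smoothly, but a stepwise-constant depth when the image shows an abrupt intensity step, placing the depth step at the image edge. The edge threshold and the data are illustrative assumptions, not values from the patent.

```python
# 1-D image-guided interpolation of a sparse depth map.

EDGE_THRESHOLD = 50  # intensity jump treated as an object boundary (assumed)

def interpolate_depth(depth_samples, image_row):
    """depth_samples: dict {pixel_index: depth_m}; image_row: list of
    intensities, one per pixel. Returns a dense list of depth values."""
    xs = sorted(depth_samples)
    dense = [None] * len(image_row)
    for x0, x1 in zip(xs, xs[1:]):
        d0, d1 = depth_samples[x0], depth_samples[x1]
        # locate the largest intensity step between the two depth samples
        steps = [abs(image_row[x + 1] - image_row[x]) for x in range(x0, x1)]
        j = max(range(len(steps)), key=steps.__getitem__)
        if steps[j] > EDGE_THRESHOLD:
            # abrupt edge: constant depth on each side, step at the edge
            for x in range(x0, x1 + 1):
                dense[x] = d0 if x <= x0 + j else d1
        else:
            # smooth gradient: plain linear interpolation
            for x in range(x0, x1 + 1):
                t = (x - x0) / (x1 - x0)
                dense[x] = d0 + t * (d1 - d0)
    return dense

row = [100, 100, 100, 200, 200, 200]    # sharp edge between pixels 2 and 3
depths = {0: 10.0, 5: 20.0}
dense = interpolate_depth(depths, row)  # -> [10, 10, 10, 20, 20, 20]
```

A 2D implementation would apply the same idea along both image axes, typically via an edge-aware filter such as a joint bilateral upsampler.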
The combined 2D/3D image contains more information about the environment than each of the sources taken separately. A machine vision system can be provided with the combined 2D/3D image to detect relevant features in the environment, such as pedestrians, vehicles, fixed objects, debris, irregularities in the road surface, and the like.
The 2D images are preferably recorded in time slots that occur between range sensing frames, for example between series of pulses. Exemplary time diagrams are shown in Figure 5. In diagram a (not according to the invention), five consecutive frames (each representing a series of pulses, for example a series 30 of Figure 2) are used for 3D range sensing. In diagram d, one of the first four frames has been replaced by a 2D image recording time slot of exactly the same duration as the other frames; hence, the total cadence of the frames remains the same, but only 75% of the time slots are used for range sensing (the person skilled in the art will realize that this ratio can be varied according to the requirements of the application; for example, the relative time reserved for range sensing can be 10%, 20%, 25%, 33%, 40%, 50%, 60%, 67%, 75%, 80%, etc.). In diagrams b and c, the time used for 2D image recording is made longer and shorter than the range sensing frames, respectively.
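The slot arrangement of diagram d, where one slot in every cycle of four is given over to 2D image recording, can be sketched as a simple labelling rule. This is an illustrative sketch; the function name and default cycle are assumptions chosen to match the 75% figure quoted above.

```python
# Label each time slot as a range sensing frame ('3D') or a 2D image slot.

def slot_schedule(n_slots, ranging_per_cycle=3, cycle=4):
    """Return a list of '3D'/'2D' labels: the first ranging_per_cycle slots
    of every cycle carry range sensing pulses, the rest record 2D images."""
    return ['3D' if i % cycle < ranging_per_cycle else '2D'
            for i in range(n_slots)]

schedule = slot_schedule(8)
# -> ['3D', '3D', '3D', '2D', '3D', '3D', '3D', '2D']  (75% range sensing)
```

Changing ranging_per_cycle and cycle yields the other ratios listed in the text (e.g. 4 of 5 for 80%).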
Generally, the time used for 2D image recording can be selected as a function of the desired exposure time, which must be long enough to accumulate a sufficient amount of light in the pixels and short enough to avoid motion blur (when the sensor and/or objects in the environment are in motion). The time slots used for 2D image recording can also be extended to allow consecutive recording of light in different wavelength bands (e.g., red, green, and blue light to generate an RGB image), as explained below with reference to Figure 7, diagram b.
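The motion-blur limit on the exposure time can be made concrete with a pinhole-projection estimate: the image of an object moving laterally at speed v at distance z sweeps across the sensor at v * f / z, so the blur in pixels grows linearly with exposure time. The 10 μm pixel pitch follows the text; the focal length, blur budget, and speeds are illustrative assumptions.

```python
# Longest exposure keeping motion blur within a pixel budget, assuming a
# pinhole camera model: blur_px = v * focal * t / (z * pitch).

def max_exposure_s(v_mps, z_m, focal_m=0.005, pitch_m=10e-6, max_blur_px=1.0):
    """Exposure time at which a laterally moving object's image has swept
    max_blur_px pixels."""
    return max_blur_px * pitch_m * z_m / (v_mps * focal_m)

# Oncoming traffic at 30 m/s seen 50 m away through an assumed 5 mm lens:
t = max_exposure_s(30.0, 50.0)   # about 3.3 ms for a 1-pixel blur budget
```

Doubling the relative speed halves the admissible exposure, which is why the flash illumination discussed below is attractive: it raises the light budget without lengthening the exposure.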
As explained in great detail in the above-mentioned patent applications WO 2015/004213 A1 and EP 15 191 288.8, various measures must be taken to filter ambient light (in particular sunlight) out of the signal that reaches the sensor, so that the light that reaches the sensor is essentially limited to the desired reflection of the projected light pattern, in order to ensure accurate and long-range distance sensing. These measures include the use of a narrow bandpass filter, and optics that guide the incoming reflected light onto a path that is substantially perpendicular to the narrow bandpass filter.
The measures required to optimize the range sensing capability limit the usability of the pixel array as a 2D imaging camera. Since the sensors are mounted on a vehicle that normally moves when in use, any increase in exposure time to raise the total amount of light captured in a single exposure is limited by the acceptable amount of motion blur; in practice this is a serious limitation.
In view of these negative effects, the inventors have found that the following optional features lead to better performance in 2D image recording.
A first solution consists of the use of a sensor bandpass filter that can be electronically or electromechanically controlled so as to be active in synchronization with the pulse sequence and inactive at the times when the 2D images are recorded.
A second solution consists in providing the system with an illumination means configured to project a beam of light onto the environment at the points in time that do not coincide with the sequence of pulses. Such an illumination means can provide a flash of illumination, as is often done in conventional photography in low-light conditions. The projection means 210 and the illumination means may share a common light source, in particular a VCSEL array. This ensures that the "flash" is emitted in the same wavelength band for which the narrow bandpass filter is optimized, thereby maximizing the amount of light that will reach the pixel array. If the common light source is a VCSEL array, the illumination means may further comprise an actively controlled diffuser, which is configured to be activated at the times when the 2D images are taken (which do not coincide with the sequence of pulses), so as to diffuse the light of the VCSEL array to form the desired beam of light (instead of a pattern of spots). The illumination means may also be integrated with the headlamp assembly, in which case the light source must be selected to provide sufficient light power in the part of the spectrum that the narrow bandpass filter at the sensor side can pass.
In view of the above considerations, one particularly advantageous embodiment of the system of the present invention operates according to the time-of-flight based principle described above, wherein a VCSEL array is provided to project a pulsed pattern of discrete spots onto the environment to be characterized, and wherein the reflections of that pattern are detected by a CMOS-based sensor array equipped with a narrow bandpass filter configured to pass essentially only the wavelength emitted by the VCSEL array. A separate flash light emitting a beam of light of the same wavelength (typically in a narrow wavelength range around 860 nm) is provided to illuminate the environment at the times when the 2D images are to be recorded; by design, the light emitted by the flash light will also be able to pass the narrow bandpass filter at the detector. Preferably, four out of five time slots are used to project and detect range sensing frames, and the remaining one of every five time slots is used to record 2D images.
In embodiments of the present invention, different pixels of the plurality of pixels of the detector 220 are provided with different filters that allow light in different wavelength bands to reach the different pixels at said times. This is schematically illustrated in Figure 6, where a pixel array is shown with different pixels (or different wells of a pixel) provided with respective filters for near-infrared (NIR) light, red light, green light, and blue light.
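The per-pixel filter arrangement of Figure 6 can be sketched as a readout that sorts raw pixel values into the four bands. The repeating 2x2 cell used here is an assumption for illustration (the exact layout in the figure may differ); the NIR pixels would serve the range sensing function, while the R, G, and B pixels supply the colour channels of the 2D image.

```python
# Sort a raw mosaic frame into NIR / R / G / B channels.

MOSAIC = [['NIR', 'R'],
          ['G',   'B']]   # assumed repeating 2x2 filter cell

def split_channels(raw):
    """raw: 2-D list of pixel values. Returns {band: list of (y, x, value)}."""
    channels = {'NIR': [], 'R': [], 'G': [], 'B': []}
    for y, row in enumerate(raw):
        for x, value in enumerate(row):
            band = MOSAIC[y % 2][x % 2]
            channels[band].append((y, x, value))
    return channels

raw = [[10, 20, 11, 21],
       [30, 40, 31, 41]]
ch = split_channels(raw)
# ch['NIR'] -> [(0, 0, 10), (0, 2, 11)]
```

As with a conventional Bayer mosaic, each channel would then be interpolated (demosaiced) to full resolution to produce the RGB image mentioned in the text.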
In other embodiments of the present invention, the plurality of pixels of the detector 220 are provided with a time-dependent filter that allows light in different wavelength bands to reach the plurality of pixels at different times. This is shown schematically in Figure 7, diagram a, where separate time slots are provided for receiving red, green, and blue light; and in Figure 7, diagram b, where the time slot is extended to allow the successive recording of red, green, and blue light in successive portions of the same time slot.
These sets of embodiments make it possible to record an RGB 2D image that is suitable for display on a color display.
The present invention also relates to a vehicle (in particular a road vehicle or a rail vehicle) which comprises the system according to one of the preceding claims, which is adapted to characterize an area surrounding the vehicle.
Although the invention has been described above with reference to individual system and method embodiments, this was only done for clarity reasons. Those skilled in the art will appreciate that features described in connection with the system or method alone may also be applied to the method or system, respectively, with the same technical effects and advantages. Furthermore, the scope of the invention is not limited to these embodiments, but is defined by the appended claims.
Claims:
Claims (9)
[1]
BE2018/4999 Claims
A system for characterizing an environment of a vehicle, the system comprising:
- a projection means (210) adapted to project a pattern of laser light in a series of pulses in the direction of the environment;
- a detector (220) comprising a plurality of pixels, the detector (220) being configured to detect light representing the laser light pattern as reflected by the environment in synchronization with the series of pulses; and
- a processing means (240) configured to calculate distances to objects (99) in the environment as a function of exposure values generated by the pixels in response to the detected light;
wherein the detector (220) is further configured to detect light forming a two-dimensional image of the environment at times that do not coincide with the series of pulses, or at pixels that do not receive the light representing the laser light pattern as reflected by the environment.
[2]
The system of claim 1, wherein the pixels of the detector (220) are configured to generate the exposure values by accumulating, for all the pulses of the series, a first amount of electrical charge representative of a first amount of light reflected by the objects (99) during a first predetermined time window (10), and a second amount of electrical charge representative of a second amount of light reflected by the objects during a second predetermined time window (20), the second predetermined time window (20) occurring after the first predetermined time window (10).
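The two-window accumulation of claim 2 follows the range-gating principle: for a pulse whose width equals the window width, the fraction of the reflected energy that falls into the second window grows linearly with the round-trip delay, so distance can be estimated from the ratio of the two accumulated charges. The sketch below is a numerical illustration of that general principle under those assumptions, not the patented implementation:

```python
C = 299_792_458.0  # speed of light, m/s

def range_gated_distance(q1, q2, window_s):
    """Estimate distance from charges accumulated over two consecutive
    time windows, assuming pulse width == window width and that the
    reflected pulse straddles the boundary between the windows."""
    round_trip = window_s * q2 / (q1 + q2)  # delay from 2nd-window fraction
    return C * round_trip / 2.0            # one-way distance

# 100 ns windows; equal charges imply a 50 ns round trip, i.e. ~7.5 m
d = range_gated_distance(q1=1000.0, q2=1000.0, window_s=100e-9)
```

Accumulating the charges over all pulses of the series, as the claim specifies, improves the signal-to-noise ratio without changing this ratio-based estimate.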
[3]
The system of claim 1, wherein the processing means (240) is adapted to determine the distances by determining a displacement of features of the detected light representing the laser light pattern as reflected by the environment with respect to predetermined feature positions.
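The displacement-based determination of claim 3 corresponds to the classical triangulation relation, where distance is inversely proportional to the observed feature displacement (disparity). A minimal sketch; the focal length and baseline values are illustrative assumptions, not parameters from the patent:

```python
def distance_from_displacement(disp_px, focal_px, baseline_m):
    """Classical triangulation: distance z = f * b / disparity,
    with the disparity being the displacement of a detected feature
    relative to its predetermined reference position."""
    return focal_px * baseline_m / disp_px

# Illustrative numbers: 1400 px focal length, 0.25 m projector-camera baseline
z = distance_from_displacement(disp_px=14.0, focal_px=1400.0, baseline_m=0.25)
```

As the background section notes, the achievable accuracy of such a scheme correlates with the length of the triangulation baseline.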
[4]
A system according to any one of the preceding claims, comprising an illumination means configured to project a light beam onto the environment at the times that do not coincide with the sequence of pulses.
[5]
The system of claim 4, wherein the projection means (210) and the illumination means share a common light source.
[6]
The system of claim 5, wherein the common light source comprises a VCSEL array, and wherein the illumination means further comprises an actively controlled diffuser configured to be activated at the times that do not coincide with the sequence of pulses, so as to distribute light from the VCSEL array to form the beam.
[7]
The system of any one of the preceding claims, wherein the plurality of pixels of the detector (220) are provided with a time-dependent filter that allows light in different wavelength bands to reach the plurality of pixels at different times.
[8]
The system of any preceding claim, wherein different pixels of the plurality of pixels of the detector (220) are provided with different filters that allow light in different wavelength bands to reach the different pixels at the times.
[9]
A vehicle comprising the system according to any of the preceding claims, which is arranged to characterize an area surrounding the vehicle.
Similar technologies:
Publication number | Publication date | Patent title
BE1025547B1|2019-04-09|System for characterizing the environment of a vehicle
BE1023788B1|2017-07-26|System and method for determining the distance to an object
US10183541B2|2019-01-22|Surround sensing system with telecentric optics
JP6387407B2|2018-09-05|Perimeter detection system
JP2018531374A6|2018-12-13|System and method for measuring distance to an object
JP2019521314A|2019-07-25|Three-dimensional imaging system
EP3519860B1|2020-09-09|System and method for determining a distance to an object
JP2019508717A|2019-03-28|Method and apparatus for active pulsed 4D camera for image acquisition and analysis
EP3519855B1|2020-09-09|System for determining a distance to an object
EP3301479A1|2018-04-04|Method for subtracting background light from an exposure value of a pixel in an imaging array, and pixel for use in same
EP3519856B1|2020-09-09|System for determining a distance to an object
JP2021507218A|2021-02-22|Systems and methods for measuring distances to objects
EP2824418A1|2015-01-14|Surround sensing system
JP2022505772A|2022-01-14|Time-of-flight sensor with structured light illumination
EP3227721B1|2018-09-12|Distance measuring device and method for determining a distance
EP3543742A1|2019-09-25|A 3d imaging system and method of 3d imaging
JP7028878B2|2022-03-02|A system for measuring the distance to an object
Patent family:
Publication number | Publication date
KR20190098242A|2019-08-21|
EP3563177B1|2020-11-18|
IL266025D0|2019-06-30|
JP2020515811A|2020-05-28|
EP3343246A1|2018-07-04|
EP3796046A1|2021-03-24|
US20190310376A1|2019-10-10|
WO2018122415A1|2018-07-05|
CN110121659A|2019-08-13|
IL266025A|2019-12-31|
BE1025547A1|2019-04-03|
EP3563177A1|2019-11-06|
Cited references:
Publication number | Application date | Publication date | Applicant | Patent title

EP1118208B1|1998-09-28|2004-11-10|3DV Systems Ltd.|Measuring distances with a camera|
JP4405154B2|2001-04-04|2010-01-27|インストロプレシジョンリミテッド|Imaging system and method for acquiring an image of an object|
BE1021971B1|2013-07-09|2016-01-29|Xenomatix Nv|ENVIRONMENTAL SENSOR SYSTEM|
GB201407267D0|2014-04-24|2014-06-11|Cathx Res Ltd|Underwater surveys|
US9753140B2|2014-05-05|2017-09-05|Raytheon Company|Methods and apparatus for imaging in scattering environments|
EP3074721B1|2014-08-08|2021-05-19|CEMB S.p.A.|Vehicle equipment with scanning system for contactless measurement|
JP6478725B2|2015-03-09|2019-03-06|キヤノン株式会社|Measuring device and robot|
US20160295133A1|2015-04-06|2016-10-06|Heptagon Micro Optics Pte. Ltd.|Cameras having a rgb-ir channel|
US9992477B2|2015-09-24|2018-06-05|Ouster, Inc.|Optical system for collecting distance information within a field|
AU2017315762B2|2016-08-24|2020-04-09|Ouster, Inc.|Optical system for collecting distance information within a field|
JP2020521954A|2017-05-15|2020-07-27|アウスター インコーポレイテッド|Optical imaging transmitter with enhanced brightness|
US11016192B2|2017-07-05|2021-05-25|Ouster, Inc.|Light ranging device with MEMS scanned emitter array and synchronized electronically scanned sensor array|
US10739189B2|2018-08-09|2020-08-11|Ouster, Inc.|Multispectral ranging/imaging sensor arrays and systems|
US20200116830A1|2018-08-09|2020-04-16|Ouster, Inc.|Channel-specific micro-optics for optical arrays|
Legal status:
2019-05-16| FG| Patent granted|Effective date: 20190409 |
Priority:
Application number | Application date | Patent title
EP16207630.1|2016-12-30|
EP16207630.1A|EP3343246A1|2016-12-30|2016-12-30|System for characterizing surroundings of a vehicle|